Nonparametric Eigenvalue-Regularized Precision or Covariance Matrix Estimator
By Clifford Lam, London School of Economics and Political Science
Abstract
We introduce nonparametric eigenvalue regularization of a sample covariance matrix through splitting of the data (NERCOME), and prove that NERCOME enjoys asymptotically optimal nonlinear shrinkage of eigenvalues with respect to the Frobenius norm. One advantage of NERCOME is its computational speed when the dimension is not too large. We prove that NERCOME is positive definite almost surely, as long as the true covariance matrix is, even when the dimension is larger than the sample size. With respect to Stein's loss function, the inverse of our estimator is asymptotically the optimal precision matrix estimator. Asymptotic efficiency loss is defined through comparison with an ideal estimator that assumes knowledge of the true covariance matrix. We show that the asymptotic efficiency loss of NERCOME is almost surely 0 with a suitable split location of the data. We also show that all of the aforementioned optimality results hold for data with a factor structure. Our method avoids the need to first estimate any unknowns from a factor model, and directly gives the covariance or precision matrix estimator, which can be useful when factor analysis is not the ultimate goal. We compare the performance of our estimators with other methods through extensive simulations and real data analysis.
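The data-splitting idea described above can be sketched in a few lines of NumPy. This is a minimal, hedged illustration (a single split, without the averaging over multiple splits or the data-driven choice of split location discussed in the paper): the eigenvectors are taken from the sample covariance of the first portion of the data, and the eigenvalues are regularized using the held-out second portion. The function name `nercome_sketch` and the fixed split point are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def nercome_sketch(X, split):
    """Sketch of a NERCOME-style estimator with one fixed data split.

    X     : (n, p) data matrix, rows are observations.
    split : number of observations in the first portion.
    """
    X1, X2 = X[:split], X[split:]
    # Eigenvectors P1 from the sample covariance of the first portion.
    S1 = np.cov(X1, rowvar=False)
    _, P1 = np.linalg.eigh(S1)
    # Sample covariance of the held-out second portion.
    S2 = np.cov(X2, rowvar=False)
    # Regularized eigenvalues: diagonal of P1' S2 P1, so each eigendirection
    # of the first portion gets the variance it explains in the second portion.
    d = np.diag(P1.T @ S2 @ P1)
    # Recompose: symmetric and positive definite a.s. when the second
    # portion's sample covariance has full rank.
    return P1 @ np.diag(d) @ P1.T
```

Because each regularized eigenvalue is a quadratic form in the second portion's sample covariance, the resulting estimator is symmetric and (almost surely) positive definite whenever that held-out sample covariance is, which mirrors the positive-definiteness claim in the abstract.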
Similar Resources
Feasible Multivariate Nonparametric Regression Estimation Using Weak Separability
One of the main practical problems of nonparametric regression estimation is the curse of dimensionality. The curse of dimensionality arises because nonparametric regression estimates are dependent variable averages local to the point at which the regression function is to be estimated. The number of observations ‘local’ to the point of estimation decreases exponentially with the number of dime...
Improved Nonparametric Discriminant Analysis for Classification of Hyperspectral Images with Limited Training Samples
Feature extraction performs an important role in improving hyperspectral image classification. Compared with parametric methods, nonparametric feature extraction methods have better performance when classes have no normal distribution. Besides, these methods can extract more features than what parametric feature extraction methods do. Nonparametric feature extraction methods use nonparametric s...
Provably Good Early Detection of Diseases using Non-Sparse Covariance-Regularized Linear Discriminant Analysis
To improve the performance of Linear Discriminant Analysis (LDA) for early detection of diseases using Electronic Health Records (EHR) data, we propose ED – a novel framework for EHR based Early Detection of Diseases on top of Covariance-Regularized LDA models. Specifically, ED employs a non-sparse inverse covariance matrix (or namely precision matrix) estimator derived from graphical lasso and...
Title of Dissertation: Nonparametric Quasi-likelihood in Longitudinal Data Analysis
Title of Dissertation: Nonparametric Quasi-likelihood in Longitudinal Data Analysis Xiaoping Jiang, Doctor of Philosophy, 2004 Dissertation directed by: Professor Paul J. Smith Statistics Program Department of Mathematics This dissertation proposes a nonparametric quasi-likelihood approach to estimate regression coefficients in the class of generalized linear regression models for longitudinal ...
Publication date: 2016